52 research outputs found

    International Standardization Of Water Information Exchange: Activities Of The WMO/OGC Hydrology Domain Working Group

    Hydrologic information is generated and published by many government, research, commercial and citizen groups around the world. The formats and protocols used to share the data are heterogeneous, with little agreement about the semantics of hydrologic measurements, the description of hydrologic features, or metadata content. A broad consensus on hydrologic data sharing formats is needed to ensure that the information can be reliably discovered, interpreted, accessed and integrated. This has been the focus of the Hydrology Domain Working Group (Hydro DWG), established in 2009 as a joint working group of the World Meteorological Organisation (WMO) and the Open Geospatial Consortium (OGC). It consists of members from government, research and the commercial sectors, and it plays an important role by bringing organisations together to agree on ways to improve our ability to share water information. In September 2012 the OGC adopted "WaterML2.0 Part 1: Time series" as an OGC standard. At its 14th Session in November 2012, the WMO Commission for Hydrology adopted a resolution that begins the process of registering this standard as a joint WMO/ISO standard. WaterML2.0 is the first international standard for encoding water observation time series, developed by the Hydro DWG and the WaterML2.0 Standards Working Group after several years of specification work and interoperability experiments. Built on widely used OGC and ISO standards, it represents a breakthrough for linking local to global water information sources into large water information networks and enabling efficient analysis and modelling of water data across information sources. WaterML2.0 Part 1 is the first in a series of planned enhancements and extensions. Current work involves standardised communication of descriptions of surface and groundwater features, ratings and gaugings, river cross sections and water quality. The working group has also contributed to the development of a hydrology domain feature model and hydrology vocabularies, which are essential for interoperability.
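The encoding described above is organized around time-value pairs. The following Python sketch illustrates that pattern; the `wml2` namespace URI and the `MeasurementTimeseries`/`MeasurementTVP` element names follow the general shape of the published schema, but treat the exact names here as illustrative assumptions rather than a normative WaterML2.0 encoding.

```python
# Illustrative sketch of a WaterML2.0-style time series: each observation
# becomes a time-value pair (TVP) inside a measurement time-series element.
# Namespace and element names are assumptions for illustration only.
import xml.etree.ElementTree as ET

WML2 = "http://www.opengis.net/waterml/2.0"
ET.register_namespace("wml2", WML2)

def encode_series(observations):
    """Encode (iso_time, value) pairs as a wml2:MeasurementTimeseries."""
    series = ET.Element(f"{{{WML2}}}MeasurementTimeseries")
    for iso_time, value in observations:
        point = ET.SubElement(series, f"{{{WML2}}}point")
        tvp = ET.SubElement(point, f"{{{WML2}}}MeasurementTVP")
        ET.SubElement(tvp, f"{{{WML2}}}time").text = iso_time
        ET.SubElement(tvp, f"{{{WML2}}}value").text = str(value)
    return ET.tostring(series, encoding="unicode")

xml_doc = encode_series([("2012-09-01T00:00:00Z", 1.23),
                         ("2012-09-01T01:00:00Z", 1.31)])
```

Because the encoding is plain XML over shared OGC/ISO concepts, any consumer that understands the time-value-pair pattern can parse series from heterogeneous providers the same way.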

    The INCF Digital Atlasing Program: Report on Digital Atlasing Standards in the Rodent Brain

    The goal of the INCF Digital Atlasing Program is to provide the vision and direction necessary to make the rapidly growing collection of multidimensional data of the rodent brain (images, gene expression, etc.) widely accessible and usable to the international research community. The Digital Brain Atlasing Standards Task Force was formed in May 2008 to investigate the state of rodent brain digital atlasing and to formulate standards, guidelines, and policy recommendations.

Our first objective has been the preparation of a detailed document that includes the vision and a specific description of an infrastructure, systems and methods capable of serving the scientific goals of the community, as well as practical issues involved in achieving the goals. This report builds on the 1st INCF Workshop on Mouse and Rat Brain Digital Atlasing Systems (Boline et al., 2007, _Nature Precedings_, doi:10.1038/npre.2007.1046.1) and includes a more detailed analysis of both the current state and the desired state of digital atlasing, along with specific recommendations for achieving these goals.

    The NIEHS Environmental Health Sciences Data Resource Portal: Placing Advanced Technologies in Service to Vulnerable Communities

    BACKGROUND: Two devastating hurricanes ripped across the Gulf Coast of the United States during 2005. The effects of Hurricane Katrina were especially severe: The human and environmental health impacts on New Orleans, Louisiana, and other Gulf Coast communities will be felt for decades to come. The Federal Emergency Management Agency (FEMA) estimates that Katrina’s destruction disrupted the lives of roughly 650,000 Americans. Over 1,300 people died. The projected economic costs for recovery and reconstruction are likely to exceed $125 billion. OBJECTIVES: The NIEHS (National Institute of Environmental Health Sciences) Portal aims to provide decision makers with the data, information, and the tools they need to a) monitor human and environmental health impacts of disasters; b) assess and reduce human exposures to contaminants; and c) develop science-based remediation, rebuilding, and repopulation strategies. METHODS: The NIEHS Portal combines advances in geographic information systems (GIS), data mining/integration, and visualization technologies through new forms of grid-based (distributed, web-accessible) cyberinfrastructure. RESULTS: The scale and complexity of the problems presented by Hurricane Katrina made it evident that no stakeholder alone could tackle them and that there is a need for greater collaboration. The NIEHS Portal provides a collaboration-enabling, information-laden base necessary to respond to environmental health concerns in the Gulf Coast region while advancing integrative multidisciplinary research. CONCLUSIONS: The NIEHS Portal is poised to serve as a national resource to track environmental hazards following natural and man-made disasters, focus medical and environmental response and recovery resources in areas of greatest need, and function as a test bed for technologies that will help advance environmental health sciences research into the modern scientific and computing era

    A Relational Model for Environmental and Water Resources Data

    Environmental observations are fundamental to hydrology and water resources, and the way these data are organized and manipulated either enables or inhibits the analyses that can be performed. The Observations Data Model presented here provides a new and consistent format for the storage and retrieval of point environmental observations in a relational database designed to facilitate integrated analysis of large data sets collected by multiple investigators. Within this data model, observations are stored with sufficient ancillary information (metadata) about the observations to allow them to be unambiguously interpreted and to provide traceable heritage from raw measurements to usable information. The design is based upon a relational database model that exposes each single observation as a record, taking advantage of the capability in relational database systems for querying based upon data values and enabling cross-dimension data retrieval and analysis. This paper presents the design principles and features of the Observations Data Model and illustrates how it can be used to enhance the organization, publication, and analysis of point observations data while retaining a simple relational format. The contribution of the data model to water resources is that it represents a new, systematic way to organize and share data that overcomes many of the syntactic and semantic differences between heterogeneous data sets, thereby facilitating an integrated understanding of water resources based on more extensive and fully specified information.

    Cyberinfrastructure for the digital brain: spatial standards for integrating rodent brain atlases

    Biomedical research entails capture and analysis of massive data volumes, and new discoveries arise from data integration and mining. This is only possible if data can be mapped onto a common framework, such as the genome for genomic data. In neuroscience, the framework is intrinsically spatial and based on a number of paper atlases. This cannot meet today’s data-intensive analysis and integration challenges. A scalable and extensible software infrastructure that is standards-based but open to novel data and resources is required for integrating information such as signal distributions, gene expression, neuronal connectivity, electrophysiology, anatomy, and developmental processes. Therefore, the International Neuroinformatics Coordinating Facility (INCF) initiated the development of a spatial framework for neuroscience data integration with an associated Digital Atlasing Infrastructure (DAI). A prototype implementation of this infrastructure for the rodent brain is reported here. The infrastructure is based on a collection of reference spaces to which data is mapped at the required resolution, such as the Waxholm Space (WHS), a 3D reconstruction of the brain generated using high-resolution, multi-channel microMRI. The core standards of the digital atlasing service-oriented infrastructure include the Waxholm Markup Language (WaxML), an XML schema expressing a uniform information model for key elements such as coordinate systems, transformations, points of interest (POIs), labels, and annotations; and Atlas Web Services, interfaces for querying and updating atlas data. The services return WaxML-encoded documents with information about capabilities, spatial reference systems and structures, and execute coordinate transformations and POI-based requests. Key elements of the INCF-DAI cyberinfrastructure have been prototyped for both mouse and rat brain atlas sources, including the Allen Mouse Brain Atlas, UCSD Cell-Centered Database, and Edinburgh Mouse Atlas Project.
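The POI-based coordinate transformation that the atlas services execute can be sketched as a registry of mappings between named reference spaces. In the Python sketch below, the space names echo those in the abstract, but the affine transform itself uses made-up placeholder values (a real WHS-to-atlas registration would come from the infrastructure, not from this sketch).

```python
# Hedged sketch of POI coordinate transformation between reference spaces,
# the core operation behind the Atlas Web Services described above.
# The affine coefficients are invented placeholders, not a real registration.
from dataclasses import dataclass

@dataclass
class POI:
    space: str   # name of the spatial reference space, e.g. "WHS"
    x: float
    y: float
    z: float

def whs_to_allen(p: POI) -> POI:
    # Illustrative affine mapping: uniform scale plus in-plane offset.
    return POI("AllenAtlas", p.x * 0.5 + 10.0, p.y * 0.5 + 10.0, p.z * 0.5)

# Registry of available transforms: (source space, target space) -> mapping
TRANSFORMS = {("WHS", "AllenAtlas"): whs_to_allen}

def transform(poi: POI, target_space: str) -> POI:
    """Map a POI into another reference space, as an atlas service would."""
    return TRANSFORMS[(poi.space, target_space)](poi)

mapped = transform(POI("WHS", 2.0, 4.0, 6.0), "AllenAtlas")
```

Keeping transforms in a registry keyed by (source, target) mirrors the service-oriented design: a client asks the infrastructure which mappings exist, then requests the one it needs.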

    A New Technology for Interactive Online Mapping with Vector Markup and XML

    In this paper, a new technology for Internet cartography is demonstrated that uses direct vector rendering in a browser to create highly interactive virtual maps from distributed sources of geographic data. This technology is made possible by the advent of XML (eXtensible Markup Language) and XML applications for 2D vector rendering such as VML (Vector Markup Language) and SVG (Scalable Vector Graphics). AXIOMAP (Application of XML for Interactive Online Mapping) is a Web map publishing kit and a customizable virtual map interface that allows for the display and manipulation of multiple point, line and area layers, database query, choropleth mapping, hyperlinking, map labeling and annotation. To render maps in a Web browser (Internet Explorer 5, in the current version), AXIOMAP generates VML shapes "on the fly" from XML-encoded geographic data that can physically reside on different servers. A thin client-side solution, AXIOMAP provides for better interactivity than traditional map server-based approaches. The paper explains the functionality of AXIOMAP, the technology behind it, and presents several applications. A free version of the software can be downloaded from www.elzaresearch.com/landv
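The core "on the fly" idea, generating vector markup directly from XML-encoded geographic data, can be illustrated compactly. The sketch below emits SVG rather than the now-obsolete VML, and the input element and attribute names (`layer`, `pt`, `x`, `y`, `label`) are assumptions for illustration, not AXIOMAP's actual data format.

```python
# Sketch of generating vector markup from XML-encoded point features,
# the rendering pattern the paper describes. SVG stands in for VML here;
# the input schema is an invented placeholder.
import xml.etree.ElementTree as ET

points_xml = """<layer name="cities">
  <pt id="a" x="30" y="40" label="Springfield"/>
  <pt id="b" x="70" y="25" label="Shelbyville"/>
</layer>"""

def to_svg(xml_text: str, radius: int = 3) -> str:
    """Turn each XML point feature into an SVG circle with a text label."""
    layer = ET.fromstring(xml_text)
    svg = ET.Element("svg", xmlns="http://www.w3.org/2000/svg",
                     width="100", height="100")
    for pt in layer.findall("pt"):
        ET.SubElement(svg, "circle", cx=pt.get("x"), cy=pt.get("y"),
                      r=str(radius))
        label = ET.SubElement(svg, "text", x=pt.get("x"), y=pt.get("y"))
        label.text = pt.get("label")
    return ET.tostring(svg, encoding="unicode")

svg_doc = to_svg(points_xml)
```

Because the geometry arrives as data rather than pre-rendered tiles, the client can restyle, query, or annotate features without another round trip to a map server, which is the interactivity advantage the paper claims.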